352 research outputs found

    Precipitation: A Parameter changing climate and modified by climate change

    On the verification of climate reconstructions

    The skill of proxy-based reconstructions of Northern Hemisphere temperature is reassessed. Using a rigorous verification method, we show that previous skill estimates exceeding 50% mainly reflect a sampling bias, and that more realistic values lie around 25%. The bias results from the strong trends in the instrumental period, together with the special partitioning into calibration and validation parts. This setting is characterized by very few degrees of freedom and leaves the regression susceptible to nonsense predictors. By basing the new estimates on 100 random resamplings of the instrumental period, we avoid the problem of a priori different calibration and validation statistics and obtain robust estimates with uncertainties. The low verification scores apply to an entire suite of multiproxy regression-based models, including the most recent variants. It is doubtful whether the estimated levels of verifiable predictive power are strong enough to resolve the current debate on the millennial climate.
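The resampling strategy described above can be sketched with synthetic data. Everything here (series lengths, noise levels, the 50/50 split, squared correlation as the skill score) is an illustrative assumption, not the paper's actual data or metric:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: an "instrumental" temperature series with a
# strong trend plus noise, and a proxy that tracks it only weakly.
n = 120                                   # years of overlap
temp = 0.01 * np.arange(n) + rng.normal(0, 0.2, n)
proxy = 0.5 * temp + rng.normal(0, 0.3, n)

def calibrate_validate(cal_idx, val_idx):
    """OLS calibration on cal_idx, squared-correlation skill on val_idx."""
    slope, intercept = np.polyfit(proxy[cal_idx], temp[cal_idx], 1)
    pred = slope * proxy[val_idx] + intercept
    return np.corrcoef(pred, temp[val_idx])[0, 1] ** 2

# Random resampling: 100 random 50/50 partitions instead of one fixed
# early/late split, giving a distribution of skill rather than one number.
scores = []
for _ in range(100):
    perm = rng.permutation(n)
    scores.append(calibrate_validate(perm[: n // 2], perm[n // 2 :]))

print(f"median skill: {np.median(scores):.2f}, "
      f"90% range: {np.percentile(scores, 5):.2f}"
      f"-{np.percentile(scores, 95):.2f}")
```

Because the random partitions break the trend-driven separation between calibration and validation halves, the spread of `scores` exposes how much apparent skill a single fixed split can owe to sampling.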

    Processes and modelling

    Dynamic-stochastic forecast experiments with general circulation models for time ranges from 10 days to 100 years

    The different time scales of the climate system and the interactions between its individual components generate nonlinear fluctuations (the so-called climate noise), which can mask the signals expected from projections with complex numerical models. In addition to this noise, the deficiencies of the models themselves produce spurious fluctuations that can corrupt numerical forecasts. This study investigates to what extent the predicted signal can be separated from the climate noise by a statistical-dynamical forecasting method. The same method is used to estimate how much the model deficiencies, e.g. the resolution of the models, contribute to this noise. The statistical-dynamical method employed is the so-called Monte Carlo technique, in which an ensemble of forecasts with perturbed initial conditions is integrated. A special form of the Monte Carlo technique is the "lagged average forecasting" technique, in which the perturbed initial conditions are generated from observations taken at regular intervals. The dynamical-statistical model calculations cover three time ranges:
    From 10 days to 3 months: In this range the aim is a forecast of the actual weather, or at least of the expected large-scale weather regimes, together with information on how reliable the forecasts are. It turns out that on average the "lagged average forecasting" method yields a slight advantage over a deterministic forecast, but this advantage is easily offset by an increase in model resolution. Moreover, individual forecasts are sometimes clearly better than the ensemble. Nor is it possible to infer the quality of the overall forecast from the spread of the individual results, as theoretical studies had suggested. One possible reason is that the initial data, being based on observations at different times, vary too strongly in quality, which leads to an additional and poorly controllable dispersion of the forecasts.
    From 1 month to 1 year: In this range the aim is to predict the influence of anomalous boundary conditions on the seasonal weather. The present work examines the effect of a sea surface temperature anomaly on the atmospheric circulation. With the Monte Carlo method it is possible to find a statistically significant signal even in mid-latitudes.
    From 1 year to 100 years: In this range the aim is, among other things, to estimate the influence of anthropogenic emissions of trace gases into the atmosphere on the evolution of the climate. On this time scale, internal fluctuations of the oceans contribute substantially to the climate noise and are quite capable of masking climate changes caused by anthropogenic influences. The present study shows that ensemble averaging of the Monte Carlo experiments yields a clear climate change signal. In addition, the method provides an estimate of the quantities in which a climate change is most readily detectable and of the parameters that are strongly overlaid by noise. It thus gives valuable guidance on which observed climate quantities are most suitable for the detection of anthropogenic climate change.
    The results justify the large computational effort required for these statistical-dynamical forecasts, since the method provides valuable additional information that could not be obtained from a single deterministic projection.
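The ensemble-averaging idea at the heart of the Monte Carlo technique can be illustrated with a toy calculation. The trend magnitude, noise level, and ensemble size below are assumptions for the demonstration, not values from the experiments:

```python
import numpy as np

rng = np.random.default_rng(42)

# Each ensemble member is the same forced "climate change" signal plus
# independent noise standing in for perturbed initial conditions.
years = np.arange(100)
signal = 0.02 * years                     # imposed warming trend (K)
n_members = 20

members = signal + rng.normal(0, 0.5, size=(n_members, years.size))
ensemble_mean = members.mean(axis=0)

# Averaging N independent members shrinks the noise by sqrt(N), so the
# trend emerges from the ensemble mean much earlier than from any member.
noise_single = (members[0] - signal).std()
noise_mean = (ensemble_mean - signal).std()
print(noise_single / noise_mean)          # roughly sqrt(20), i.e. about 4.5
```

The same averaging also reveals which quantities stay noise-dominated: variables whose member-to-member spread does not shrink toward the forced signal are poor candidates for detecting a climate change.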

    A global coupled ocean-atmosphere model

    Comparison of Proxy and Multimodel Ensemble Means

    Proxy‐model comparisons show large discrepancies in the impact of volcanic aerosols on the hydrology of the Asian monsoon region (AMR). This has mostly been attributed to uncertainties arising from the use of a single model in previous studies. Here we compare two groups of CMIP5 multimodel ensemble means (MMEMs) with the tree‐ring‐based reconstruction Monsoon Asia Drought Atlas (MADA PDSI) to examine their reliability in reproducing the hydrological effects of volcanic eruptions in 1300–1850 CE. Time series plots indicate that the MADA PDSI and the MMEMs agree on the significant drying effect of volcanic perturbation over the monsoon‐dominated subregion, while disparities exist over the westerlies‐dominated subregion. Comparisons of the spatial patterns suggest that the MADA PDSI and the MMEMs show better agreement 1 year after a volcanic eruption than in the eruption year, and in subregions where more tree‐ring chronologies are available. The MADA PDSI and the CMIP5 MMEMs agree on the drying effect of volcanic eruptions in the western‐East Asia, South Asian summer monsoon, and northern East Asian summer monsoon (EASM) regions. Model results suggest a significant wetting effect in the southern EASM and western‐South Asia, which agrees with the observed hydrological response to the 1991 Mount Pinatubo eruption. Analysis of model output from the Last Millennium Ensemble project shows similar hydrological responses. These results suggest that the CMIP5 MMEM is able to reproduce the impact of volcanic eruptions on the hydrology of the southern AMR.
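A standard way to composite a drought index around eruption years, as done in proxy-model comparisons of this kind, is superposed epoch analysis. The sketch below uses a synthetic index and hypothetical eruption years, not the MADA data or necessarily the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in drought index for a 551-year window (e.g. 1300-1850 CE).
n_years = 551
pdsi = rng.normal(0, 1, n_years)
eruption_years = [50, 120, 260, 300, 430]   # hypothetical event indices

# Impose a drying dip 1-2 years after each "eruption" for the demo.
for y in eruption_years:
    pdsi[y + 1 : y + 3] -= 2.0

def superposed_epoch(series, events, before=3, after=5):
    """Mean response across events, indexed by year relative to eruption."""
    windows = [series[e - before : e + after + 1] for e in events]
    return np.arange(-before, after + 1), np.mean(windows, axis=0)

lags, composite = superposed_epoch(pdsi, eruption_years)
# The composite minimum (strongest drying) falls shortly after the event.
print(dict(zip(lags.tolist(), np.round(composite, 2).tolist())))
```

Averaging over events suppresses unforced variability by roughly the square root of the number of eruptions, which is why the post-eruption drying stands out in the composite even when individual years are noisy.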

    Process-oriented statistical-dynamical evaluation of LM precipitation forecasts

    The objective of this study is the scale-dependent evaluation of precipitation forecasts of the Lokal-Modell (LM) of the German Weather Service in relation to dynamical and cloud parameters. For this purpose the newly designed Dynamic State Index (DSI) is correlated with clouds and precipitation. The DSI quantitatively describes the deviation from, and relative distance to, a stationary and adiabatic solution of the primitive equations. A case study and a statistical analysis of clouds and precipitation demonstrate the suitability of the DSI as a dynamical threshold parameter. This confirms the importance of imbalances of the atmospheric flow field, which dynamically induce the generation of rainfall.

    A climate change simulation starting from 1935

    Due to restrictions in the available computing resources and a lack of suitable observational data, transient climate change experiments with global coupled ocean-atmosphere models have been started from an initial state at equilibrium with the present-day forcing. The historical development of greenhouse gas forcing from the onset of industrialization until the present has therefore been neglected. Studies with simplified models have shown that this "cold start" error leads to a serious underestimation of the anthropogenic global warming. In the present study, a 150-year integration has been carried out with a global coupled ocean-atmosphere model starting from the greenhouse gas concentration observed in 1935, i.e., at an early time of industrialization. The model was forced with observed greenhouse gas concentrations up to 1985, and with the equivalent CO2 concentrations stipulated in Scenario A ("Business as Usual") of the Intergovernmental Panel on Climate Change from 1985 to 2085. The early starting date alleviates some of the cold start problems. The global mean near-surface temperature change in 2085 is about 0.3 K (ca. 10%) higher in the early industrialization experiment than in an integration with the same model and identical Scenario A greenhouse gas forcing, but with a start date in 1985. Comparisons between the experiments with early and late start dates show considerable differences in the amplitude of the regional climate change patterns, particularly for sea level. The early industrialization experiment can be used to obtain a first estimate of the detection time for a greenhouse-gas-induced near-surface temperature signal. Detection time estimates are obtained using globally and zonally averaged data from the experiment and a long control run, as well as principal component time series describing the evolution of the dominant signal and noise modes.
The latter approach yields the earliest detection time (in the decade 1990-2000) for the time-evolving near-surface temperature signal. For global-mean temperatures or for temperatures averaged between 45°N and 45°S, the signal detection times are in the decades 2015-2025 and 2005-2015, respectively. The reduction of the "cold start" error in the early industrialization experiment makes it possible to separate the near-surface temperature signal from the noise about one decade earlier than in the experiment starting in 1985. We stress that these detection times are only valid in the context of the coupled model's internally-generated natural variability, which possibly underestimates low frequency fluctuations and does not incorporate the variance associated with changes in external forcing factors, such as anthropogenic sulfate aerosols, solar variability or volcanic dust. © 1995 Springer-Verlag
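The detection-time logic described above can be reduced to a schematic: compare a time-evolving signal against a noise level estimated from a control run. All magnitudes below (trend, noise level, the 2-sigma criterion) are illustrative assumptions, not values from the experiment:

```python
import numpy as np

rng = np.random.default_rng(7)

# Signal: a warming trend from a forced run started in 1935.
years = 1935 + np.arange(151)               # 150-year integration
signal = 0.012 * (years - 1935)             # ~1.8 K warming by 2085

# Noise: interannual variability sampled from a long control run.
control = rng.normal(0, 0.15, 1000)

# Declare detection when the signal exceeds twice the control-run
# standard deviation; an earlier start date means the trend has more
# time to climb above the noise by any given calendar year.
threshold = 2 * control.std()
detected = years[signal > threshold]
print(detected[0] if detected.size else None)
```

The same comparison run against a trend that only begins in 1985 would cross the threshold correspondingly later, which is the essence of the cold-start penalty on detection time.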

    Ocean variability and its influence on the detectability of greenhouse warming signals

    Recent investigations have considered whether it is possible to achieve early detection of greenhouse-gas-induced climate change by observing changes in ocean variables. In this study we use model data to assess some of the uncertainties involved in estimating when we could expect to detect ocean greenhouse warming signals. We distinguish between detection periods and detection times. As defined here, detection period is the length of a climate time series required in order to detect, at some prescribed significance level, a given linear trend in the presence of the natural climate variability. Detection period is defined in model years and is independent of reference time and the real time evolution of the signal. Detection time is computed for an actual time-evolving signal from a greenhouse warming experiment and depends on the experiment's start date. Two sources of uncertainty are considered: those associated with the level of natural variability or noise, and those associated with the time-evolving signals. We analyze the ocean signal and noise for spatially averaged ocean circulation indices such as heat and fresh water fluxes, rate of deep water formation, salinity, temperature, transport of mass, and ice volume. The signals for these quantities are taken from recent time-dependent greenhouse warming experiments performed by the Max Planck Institute for Meteorology in Hamburg with a coupled ocean-atmosphere general circulation model. The time-dependent greenhouse gas increase in these experiments was specified in accordance with scenario A of the Intergovernmental Panel on Climate Change. The natural variability noise is derived from a 300-year control run performed with the same coupled atmosphere-ocean model and from two long (>3000 years) stochastic forcing experiments in which an uncoupled ocean model was forced by white noise surface flux variations. 
In the first experiment the stochastic forcing was restricted to the fresh water fluxes, while in the second experiment the ocean model was additionally forced by variations in wind stress and heat fluxes. The mean states and ocean variability are very different in the three natural variability integrations. A suite of greenhouse warming simulations with identical forcing but different initial conditions reveals that the signal estimated from these experiments may evolve in noticeably different ways for some ocean variables. The combined signal and noise uncertainties translate into large uncertainties in estimates of detection time. Nevertheless, we find that ocean variables that are highly sensitive indicators of surface conditions, such as convective overturning in the North Atlantic, have shorter signal detection times (35–65 years) than deep-ocean indicators (≥100 years). We also investigate whether the use of a multivariate detection vector increases the probability of early detection. We find that this can yield detection times of 35–60 years (relative to a 1985 reference date) if signal and noise are projected onto a common "fingerprint" which describes the expected signal direction. Optimization of the signal-to-noise ratio by (spatial) rotation of the fingerprint in the direction of low-noise components of the stochastic forcing experiments noticeably reduces the detection time (to 10–45 years). However, rotation in space alone does not guarantee an improvement of the signal-to-noise ratio for a time-dependent signal. This requires an "optimal fingerprint" strategy in which the detection pattern (fingerprint) is rotated in both space and time.
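The fingerprint projection can be sketched in a few lines. The pattern, field sizes, noise level, and 2-sigma detection criterion are illustrative assumptions; the real method operates on model-derived spatial patterns and optimizes the fingerprint against the control-run covariance:

```python
import numpy as np

rng = np.random.default_rng(3)

# A guess pattern ("fingerprint") describing the expected signal direction.
n_points = 200                              # grid points of a spatial field
fingerprint = rng.normal(0, 1, n_points)
fingerprint /= np.linalg.norm(fingerprint)  # unit norm

t = np.arange(100)
# Signal field: the fingerprint pattern growing linearly in time.
signal_field = np.outer(0.02 * t, fingerprint)
# Noise field: spatially white control-run variability.
noise_field = rng.normal(0, 0.5, size=(t.size, n_points))

# Projecting onto the fingerprint collapses each field into one scalar
# detection variable, discarding noise orthogonal to the pattern.
detection_var = (signal_field + noise_field) @ fingerprint
noise_proj = noise_field @ fingerprint      # noise-only baseline

# Detection time: first year the detection variable exceeds 2 sigma.
sigma = noise_proj.std()
above = t[detection_var > 2 * sigma]
print(above[0] if above.size else None)
```

Rotating the fingerprint away from high-noise directions (the "optimal fingerprint" step) shrinks `sigma` while largely preserving the projected signal, which is how the abstract's shorter detection times arise.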